# High-dimensional semantic encoding
## MMLW Retrieval RoBERTa Large V2
MMLW is a neural text encoder for Polish, optimized for information retrieval. It converts queries and passages into 1024-dimensional dense vectors suitable for semantic search.
Tags: Text Embedding, Other
Author: sdadas
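Dense retrieval with an encoder like MMLW reduces to nearest-neighbour search over its vectors: encode the query and each passage, then rank passages by cosine similarity. A minimal sketch of that ranking step, using small stand-in vectors in place of real model output (loading the actual checkpoint, e.g. via the sentence-transformers library, is assumed but not shown):

```python
import numpy as np

def cosine_rank(query_vec, passage_vecs):
    """Rank passages by cosine similarity to the query.

    query_vec: (d,) array; passage_vecs: (n, d) array.
    Returns passage indices, most similar first.
    """
    q = query_vec / np.linalg.norm(query_vec)
    p = passage_vecs / np.linalg.norm(passage_vecs, axis=1, keepdims=True)
    scores = p @ q              # cosine similarities, shape (n,)
    return np.argsort(-scores)  # indices in descending score order

# Stand-in 4-d vectors; the real encoder would emit 1024-d vectors.
query = np.array([1.0, 0.0, 0.0, 0.0])
passages = np.array([
    [0.9, 0.1, 0.0, 0.0],  # close to the query
    [0.0, 1.0, 0.0, 0.0],  # orthogonal to it
    [0.5, 0.5, 0.0, 0.0],  # in between
])
print(cosine_rank(query, passages))  # → [0 2 1]
```

The same ranking logic applies unchanged at 1024 dimensions; only the encoding step differs.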
## Dense Encoder MSMARCO DistilBERT Word2vec256k
A sentence encoder based on msmarco-word2vec256000-distilbert-base-uncased. It uses a 256k-token vocabulary initialized with word2vec embeddings and is designed for sentence-similarity tasks.
Tags: Text Embedding, Transformers

Author: vocab-transformers
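Sentence encoders of this kind typically produce a single vector by mean-pooling the transformer's token embeddings while masking out padding positions. A sketch of that pooling step on dummy arrays (in practice the token embeddings would come from the DistilBERT backbone; the shapes here are illustrative assumptions):

```python
import numpy as np

def mean_pool(token_embeddings, attention_mask):
    """Average token embeddings into one sentence vector, ignoring padding.

    token_embeddings: (seq_len, d); attention_mask: (seq_len,) of 0/1.
    """
    mask = attention_mask[:, None].astype(float)    # (seq_len, 1)
    summed = (token_embeddings * mask).sum(axis=0)  # sum over real tokens only
    count = mask.sum()                              # number of real tokens
    return summed / count

# Dummy 3-token sequence with one padding position, embedding dim d = 2.
emb = np.array([[1.0, 2.0],
                [3.0, 4.0],
                [9.0, 9.0]])   # padding row, ignored by the mask
mask = np.array([1, 1, 0])
print(mean_pool(emb, mask))    # → [2. 3.]
```

Two resulting sentence vectors can then be compared with cosine similarity to score how alike the sentences are.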